Preventing a dropout

Authors

Abstract


Similar articles

Preventing Student Dropout in Distance Learning Using Machine Learning Techniques

Student dropout occurs quite often in universities providing distance education. The scope of this research is to study whether the use of machine learning techniques can be useful in dealing with this problem. Subsequently, an attempt was made to identify the most appropriate learning algorithm for predicting students' dropout. A number of experiments have taken place with data pro...
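The snippet above is cut off; purely as a hypothetical illustration of the kind of algorithm comparison it describes (the data, features, and candidate models below are invented, not those of the study), a scikit-learn sketch might look like:

```python
# Hypothetical sketch: comparing classifiers for student-dropout prediction.
# The synthetic data and feature semantics are invented for illustration only.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
# Invented features: e.g. assignments submitted, forum logins, grade average.
X = rng.normal(size=(500, 3))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=500) < 0).astype(int)

# Compare candidate algorithms by cross-validated accuracy.
for name, clf in [("naive Bayes", GaussianNB()),
                  ("decision tree", DecisionTreeClassifier(max_depth=4)),
                  ("neural net", MLPClassifier(max_iter=1000))]:
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"{name}: {scores.mean():.3f}")
```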


A Bayesian encourages dropout

Dropout is one of the key techniques for preventing learning from overfitting. It has been explained that dropout works as a kind of modified L2 regularization. Here, we shed light on dropout from a Bayesian standpoint. The Bayesian interpretation enables us to optimize the dropout rate, which is beneficial for learning the weight parameters and for prediction after learning. The experimental results also enco...
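The "modified L2 regularization" view can be checked numerically in the simplest setting, linear regression with dropout on the inputs: averaging the squared error over unbiasedly rescaled Bernoulli masks adds the penalty $\frac{p}{1-p}\sum_j w_j^2 x_j^2$ to the noise-free loss. The following Monte Carlo sketch is an illustration of that identity, not code from the paper:

```python
# Monte Carlo check (single example): with inverted dropout on the inputs of
# a linear model, the expected squared error equals the clean squared error
# plus an L2-style penalty  p/(1-p) * sum_j w_j^2 x_j^2.  Illustration only.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=5)
w = rng.normal(size=5)
y = 1.0
p = 0.3           # drop probability
keep = 1.0 - p

masks = rng.binomial(1, keep, size=(200_000, 5)) / keep  # unbiased rescaling
mc_loss = np.mean((y - (masks * x) @ w) ** 2)

clean_loss = (y - x @ w) ** 2
penalty = (p / keep) * np.sum(w ** 2 * x ** 2)
print(mc_loss, clean_loss + penalty)   # the two values agree closely
```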


Understanding Dropout

Dropout is a relatively new algorithm for training neural networks which relies on stochastically “dropping out” neurons during training in order to avoid the co-adaptation of feature detectors. We introduce a general formalism for studying dropout on either units or connections, with arbitrary probability values, and use it to analyze the averaging and regularizing properties of dropout in bot...
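As a concrete reference point for that formalism, the basic unit-level mechanism is just an independent Bernoulli mask per unit at training time. A minimal numpy sketch, using the common "inverted" scaling (an implementation convention, not something the abstract specifies) so that no rescaling is needed at test time:

```python
# Minimal (inverted) dropout on a layer's activations: each unit is kept
# independently with probability 1 - p and scaled by 1/(1 - p) so that the
# expected activation is unchanged; at test time the layer is the identity.
import numpy as np

def dropout(h, p, rng, train=True):
    if not train or p == 0.0:
        return h
    keep = 1.0 - p
    mask = rng.binomial(1, keep, size=h.shape)
    return h * mask / keep

rng = np.random.default_rng(0)
h = rng.normal(size=(4, 8))            # a batch of hidden activations
print(dropout(h, p=0.5, rng=rng))
```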


Concrete Dropout

• Gal and Ghahramani (2015) reinterpreted dropout regularisation as approximate inference in BNNs.
• Dropout probabilities $p_k$ are variational parameters of the approximate posterior $q_\theta(\omega) = \prod_k q_{M_k, p_k}(W_k)$, where $W_k = M_k \cdot \mathrm{diag}(z_k)$ and $z_{k,l} \overset{\mathrm{iid}}{\sim} \mathrm{Bernoulli}(1 - p_k)$.
• The Concrete distribution (Maddison et al.; Jang et al.) relaxes the Categorical distribution to obtain gradients with respect to the probability vector – E...
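The point of the relaxation is that the hard Bernoulli mask is replaced by a smooth function of the dropout probability, so gradients can flow into $p$ and the dropout rate can be trained. A numpy sketch of a relaxed ("Concrete") Bernoulli keep-mask, written from the construction sketched above rather than taken from the paper's code:

```python
# Relaxed Bernoulli ("Concrete") dropout mask, differentiable in p:
#   u ~ Uniform(0,1)
#   z = 1 - sigmoid((log p - log(1-p) + log u - log(1-u)) / t)
# As t -> 0 this approaches a hard Bernoulli(1 - p) keep-mask, but for t > 0
# gradients with respect to p flow through z.  Illustration only.
import numpy as np

def concrete_dropout_mask(p, shape, t, rng):
    u = rng.uniform(size=shape)
    eps = 1e-7  # numerical safety for the logs
    logits = (np.log(p + eps) - np.log(1 - p + eps)
              + np.log(u + eps) - np.log(1 - u + eps))
    z = np.clip(logits / t, -60.0, 60.0)   # avoid overflow in exp
    return 1.0 - 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
mask = concrete_dropout_mask(p=0.2, shape=(3, 5), t=0.1, rng=rng)
print(mask.round(3))   # values near 0 or 1, yet smooth in p
```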


Fraternal Dropout

Recurrent neural networks (RNNs) form an important class of neural network architectures, useful for language modeling and sequential prediction. However, optimizing RNNs is known to be harder than optimizing feed-forward neural networks. A number of techniques have been proposed in the literature to address this problem. In this paper we propose a simple technique called fraternal dropout that t...
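The snippet is truncated before the technique itself is stated. For orientation only: in the published paper, fraternal dropout runs two weight-shared copies of the model with independent dropout masks and penalizes the squared distance between their predictions. The numpy sketch below illustrates that regularizer on a single linear layer rather than an RNN, so it is a simplification, not the paper's method:

```python
# Rough sketch of the fraternal-dropout regularizer (simplified: a single
# linear layer rather than an RNN).  Two forward passes share the same
# weights W but draw independent dropout masks; the auxiliary loss penalizes
# the squared distance between the two predictions.  Illustration only.
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(8, 3))
x = rng.normal(size=(4, 8))
y = rng.normal(size=(4, 3))
p, kappa = 0.5, 0.1        # dropout rate and regularization strength

def forward(x, W, p, rng):
    mask = rng.binomial(1, 1 - p, size=x.shape) / (1 - p)
    return (x * mask) @ W

pred1 = forward(x, W, p, rng)   # first copy, first dropout mask
pred2 = forward(x, W, p, rng)   # second copy, independent mask
task_loss = np.mean((pred1 - y) ** 2)
fraternal_penalty = kappa * np.mean((pred1 - pred2) ** 2)
print(task_loss + fraternal_penalty)
```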



Journal

Journal title: British Dental Journal

Year: 2007

ISSN: 0007-0610, 1476-5373

DOI: 10.1038/bdj.2007.132